\documentclass[10pt]{article}
\usepackage{amssymb,amsmath}
\usepackage[hmargin=1cm,vmargin=0.5cm]{geometry}
\begin{document}
{\large Similarity and Diagonalisation}\\
\begin{align*}
\text{\bf Simil}&\text{\bf arity:}\\
&\text{Given two matrices $A$ and $B$, if there is an invertible matrix $S$ such that $\boxed{A=SBS^{-1}}$, then $A$ and $B$ are similar.}\\
&\text{It follows that $AS=SB$: applying $S$ and then $A$ gives the same result as applying $B$ and then $S$.}\\
\\
&\text{For real numbers, $as=sb$ (with $s\neq0$) means $a=b$. We cannot say the same for matrices, as their multiplication}\\
&\text{is not commutative, but $AS=SB$ hints that $A$ and $B$, while not equal, share some common properties:}\\
&\text{their determinants are equal, so they are both invertible or both not; they are both diagonalisable or both not;}\\
&\text{their eigenvalues are the same (with different eigenspaces connected through $S$).}\\
\\
&\text{If $A\sim B$, then $A=SBS^{-1}$, so $B=S^{-1}AS=(S^{-1})A(S^{-1})^{-1}$, and $B\sim A$ (symmetric law).}\\
&\text{If $A\sim B$ and $B\sim C$, then $AS_1=S_1B$ and $BS_2=S_2C$, so $A(S_1S_2)=S_1BS_2=S_1S_2C=(S_1S_2)C$, and $A\sim C$ (transitive law).}\\
&\text{Because $A=IAI^{-1}$, $A\sim A$ (reflexive law). So similarity is an equivalence relation.}\\
\\
&\text{If $\lambda$ is an eigenvalue of $A$, then }\lambda I-A=\lambda I-SBS^{-1}=\lambda SIS^{-1}-SBS^{-1}=S(\lambda I-B)S^{-1}.\\
&\therefore\det(\lambda I-B)=\det(S(\lambda I-B)S^{-1})=\det(\lambda I-A)=0.\quad\text{So $\lambda$ is an eigenvalue of $B$, and vice versa.}\\
\\
&\text{If $\mathbf{v}$ is an eigenvector of $B$, then } B\mathbf{v}=\lambda\mathbf{v},~~ AS\mathbf{v}=SB\mathbf{v}=\lambda S\mathbf{v},~~ A(S\mathbf{v})=\lambda(S\mathbf{v}),~~ \text{i.e. $S\mathbf{v}$ is an eigenvector}\\
&\text{\qquad of $A$, and vice versa.}\\
\\
\text{\bf Diago}&\text{\bf nalisation:}\quad\text{If a matrix $A$ is similar to a diagonal matrix $D$, then $A$ is said to be diagonalisable. Since matrix}\\
&\text{similarity is transitive, all matrices similar to a given diagonal matrix $D$ form a ``family'' (an equivalence class)}\\
&\text{of mutually similar matrices.}\\
&\text{Let }D=
\begin{bmatrix}
d_1 & 0 & \ldots & 0 \\
0 & d_2 & \ldots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \ldots & d_n
\end{bmatrix},\quad
\lambda I-D=
\begin{bmatrix}
\lambda-d_1 & 0 & \ldots & 0 \\
0 & \lambda-d_2 & \ldots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \ldots & \lambda-d_n
\end{bmatrix},\quad
\det(\lambda I-D)=\prod_{k=1}^n(\lambda-d_k).\\
&\text{Each $d_r$ ($r=1,2,\ldots,n$) is an eigenvalue of $D$, as $\det(d_r I-D)=\prod_{k=1}^n(d_r-d_k)=0$.}\\
\\
&\text{If $A$ is similar to $D$, the eigenvalues of $A$ are exactly the $d_r$. Suppose they are all distinct, and let}\\
&\text{$P=\big[\mathbf{v_1}~\mathbf{v_2}\ldots\mathbf{v_n}\big]$ where the $\mathbf{v_r}$ are $A$'s eigenvectors; distinct eigenvalues make them mutually independent,}\\
&\text{so $\det(P)\neq 0$, $P^{-1}$ exists, and $A=PDP^{-1}$. That is, in $A=SDS^{-1}$ we may take $S=P$, whose columns are eigenvectors.}\\
\\
&\text{For diagonalisable matrices you may take similarity as ``having the same eigenvalues''. If $A$ and $B$ have the same}\\
&\text{eigenvalue matrix $D$, $A=P_A^{}DP_A^{-1}$ and $B=P_B^{}DP_B^{-1}$, then with $S=P_A^{}P_B^{-1}$ we have $P_A=SP_B$, $P_A^{-1}=P_B^{-1}S^{-1}$ and}\\
&\text{$A=P_A^{}DP_A^{-1}=(SP_B^{})D(P_B^{-1}S^{-1})=S(P_B^{}DP_B^{-1})S^{-1}=SBS^{-1}$. Conversely, similar $A$ and $B$ have the same characteristic}\\
&\text{polynomial because $\det(xI-A)=\det(xI-B)$; the roots of that polynomial are the eigenvalues of both $A$ and $B$,}\\
&\text{so they must have the same eigenvalues.}\\
\\
&\text{Not all matrices are diagonalisable: e.g. $\left[\begin{smallmatrix}1&1\\0&1\end{smallmatrix}\right]$ has the repeated eigenvalue $1$ but only one independent}\\
&\text{eigenvector, so no invertible $P$ exists. A matrix is diagonalisable if and only if it has $n$ mutually independent}\\
&\text{eigenvectors, i.e. $P$ is invertible. $A$ itself need not be invertible (invertibility only means no eigenvalue is zero),}\\
&\text{and its eigenvalues need not be distinct: with a repeated eigenvalue, $A$ may still have $n$ mutually independent}\\
&\text{eigenvectors, so $P^{-1}$ exists and $A$ is diagonalisable (i.e. similar to $D$).}\\
\end{align*}
\end{document}
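The diagonalisation procedure in the notes can be made concrete with a small worked example. The matrix below is an illustrative choice (not from the original notes): find the eigenvalues from the characteristic polynomial, find an eigenvector for each, assemble them into $P$, and check that $P$ is invertible.

```latex
% Worked example (illustrative matrix): diagonalise A = PDP^{-1}.
\[
A=\begin{bmatrix}4&1\\2&3\end{bmatrix},\qquad
\det(\lambda I-A)=(\lambda-4)(\lambda-3)-2
=\lambda^2-7\lambda+10=(\lambda-5)(\lambda-2).
\]
% Eigenvalues are lambda_1 = 5 and lambda_2 = 2; solve (A - lambda I)v = 0 for each:
\[
(A-5I)\mathbf{v}=\mathbf{0}\ \Rightarrow\ \mathbf{v_1}=\begin{bmatrix}1\\1\end{bmatrix},
\qquad
(A-2I)\mathbf{v}=\mathbf{0}\ \Rightarrow\ \mathbf{v_2}=\begin{bmatrix}1\\-2\end{bmatrix}.
\]
% Columns of P are the eigenvectors; det(P) is nonzero, so P^{-1} exists:
\[
P=\begin{bmatrix}1&1\\1&-2\end{bmatrix},\quad
\det(P)=-3\neq0,\quad
D=\begin{bmatrix}5&0\\0&2\end{bmatrix},\quad
A=PDP^{-1}.
\]
```

As a quick check, $AP=PD$ column by column: $A\mathbf{v_1}=(5,5)^T=5\mathbf{v_1}$ and $A\mathbf{v_2}=(2,-4)^T=2\mathbf{v_2}$, matching the diagonal entries of $D$.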